Medicine is the science and art (ars medicina) of healing humans. It encompasses a variety of health care practices that have evolved to maintain and restore health through the prevention and treatment of illness. Before the advent of scientific medicine, the healing arts were practiced alongside alchemical and ritual practices that developed out of religious and cultural traditions. The term "Western medicine" was until recently used to refer to scientific and science-based practices, distinguishing them from "Eastern medicine", which is typically based on traditional, orally transmitted, or otherwise non-scientific practices.
Contemporary medicine applies health science, biomedical research, and medical technology to diagnose and treat injury and disease, typically through medication, surgery, or some other form of therapy. The word medicine is derived from the Latin ars medicina, meaning the art of healing.[1][2]
Though medical technology and clinical expertise are pivotal to contemporary medicine, successful face-to-face relief of actual suffering continues to require the application of ordinary human feeling and compassion, known in English as bedside manner.[3]
Prehistoric medicine incorporated plants (herbalism), animal parts and minerals. In many cases these materials were used ritually as magical substances by priests, shamans, or medicine men. Well-known spiritual systems include animism (the notion that inanimate objects have spirits); spiritualism (an appeal to gods or communion with ancestor spirits); shamanism (the vesting of an individual with mystic powers); and divination (magically obtaining the truth). The field of medical anthropology examines the ways in which culture and society are organized around, or affected by, issues of health and health care.
Early records of medicine have been discovered from ancient Egyptian medicine, Babylonian medicine, Ayurvedic medicine (in the Indian subcontinent), classical Chinese medicine (the predecessor of modern traditional Chinese medicine), and ancient Greek and Roman medicine. The Egyptian Imhotep (3rd millennium BC) is the first physician in history known by name. The earliest records of dedicated hospitals come from Mihintale in Sri Lanka, where evidence of dedicated treatment facilities for patients has been found.[4][5] The Indian surgeon Sushruta described numerous surgical operations, including the earliest forms of plastic surgery.[6][7]
The Greek physician Hippocrates, considered the "father of medicine",[9][10] laid the foundation for a rational approach to medicine. Hippocrates introduced the Hippocratic Oath for physicians, which is still relevant and in use today, and was the first to categorize illnesses as acute, chronic, endemic and epidemic, and to use terms such as "exacerbation", "relapse", "resolution", "crisis", "paroxysm", "peak", and "convalescence".[11][12] The Greek physician Galen was also one of the greatest surgeons of the ancient world and performed many audacious operations, including brain and eye surgeries. After the fall of the Western Roman Empire and the onset of the Dark Ages, the Greek tradition of medicine went into decline in Western Europe, although it continued uninterrupted in the Eastern Roman (Byzantine) Empire.
After 750 CE, the Muslim Arab world had the works of Hippocrates, Galen and Sushruta translated into Arabic, and Islamic physicians engaged in significant medical research. Notable Islamic medical pioneers include the polymath Avicenna, who, along with Imhotep and Hippocrates, has also been called the "father of medicine".[13][14] He wrote The Canon of Medicine, considered one of the most famous books in the history of medicine.[15] Others include Abulcasis,[16] Avenzoar,[17] Ibn al-Nafis,[18] and Averroes.[19] Rhazes[20] was one of the first to question the Greek theory of humorism, which nevertheless remained influential in both medieval Western and medieval Islamic medicine.[21] The Islamic Bimaristan hospitals were an early example of public hospitals.[22][23] However, the fourteenth- and fifteenth-century Black Death was just as devastating to the Middle East as to Europe, and it has even been argued that Western Europe was generally more effective in recovering from the pandemic than the Middle East.[24] In the early modern period, important figures in medicine and anatomy emerged in Europe, including Gabriele Falloppio and William Harvey.
The major shift in medical thinking was the gradual rejection, especially during the Black Death of the 14th and 15th centuries, of what may be called the 'traditional authority' approach to science and medicine: the notion that because some prominent person in the past had said something must be so, that was the way it was, and anything observed to the contrary was an anomaly. This was paralleled by a similar shift in European society in general (see Copernicus's rejection of Ptolemy's theories on astronomy). Physicians such as Ibn al-Nafis and Vesalius improved upon or disproved some of the theories of the past.
Modern scientific biomedical research (in which results are testable and reproducible) began to replace early Western traditions based on herbalism, the Greek "four humours" and other such pre-modern notions. The modern era really began with Edward Jenner's discovery of the smallpox vaccine at the end of the 18th century (inspired by the method of inoculation earlier practiced in Asia), Robert Koch's discoveries around 1880 of the transmission of disease by bacteria, and then the discovery of antibiotics in the early 20th century. The post-18th-century period of modernity brought more groundbreaking researchers from Europe. From Germany and Austria, doctors such as Rudolf Virchow, Wilhelm Conrad Röntgen, Karl Landsteiner, and Otto Loewi made contributions. In the United Kingdom, Alexander Fleming, Joseph Lister, Francis Crick, and Florence Nightingale are considered important. From New Zealand and Australia came Maurice Wilkins, Howard Florey, and Frank Macfarlane Burnet. In the United States, William Williams Keen, Harvey Cushing, William Coley, and James D. Watson did significant work, as did researchers in Italy (Salvador Luria), Switzerland (Alexandre Yersin), Japan (Kitasato Shibasaburo), and France (Jean-Martin Charcot, Claude Bernard, Paul Broca and others). The Russian Nikolai Korotkov and Sir William Osler also made important contributions.
As science and technology developed, medicine became more reliant upon medications. Throughout history, and in Europe right up until the late 18th century, not only animal and plant products but also human body parts and fluids were used as medicine.[25] Pharmacology developed from herbalism, and many drugs are still derived from plants (atropine, ephedrine, warfarin, aspirin, digoxin, vinca alkaloids, taxol, hyoscine, etc.). Vaccines were discovered by Edward Jenner and Louis Pasteur. The first synthetic antimicrobial drug, arsphenamine (Salvarsan), was discovered by Paul Ehrlich in 1908 after he observed that bacteria took up toxic dyes that human cells did not. The first major class of antibiotics was the sulfa drugs, derived by French chemists originally from azo dyes. Drug therapy has become increasingly sophisticated; modern biotechnology allows drugs targeted towards specific physiological processes to be developed, sometimes designed for compatibility with the body to reduce side-effects. Genomics and knowledge of human genetics are having some influence on medicine, as the causative genes of most monogenic genetic disorders have now been identified, and developments in molecular biology and genetics are influencing medical technology, practice and decision-making.
Evidence-based medicine is a contemporary movement to establish the most effective algorithms of practice (ways of doing things) through the use of systematic reviews and meta-analyses. The movement is facilitated by modern global information science, which allows as much of the available evidence as possible to be collected and analyzed according to standard protocols and then disseminated to healthcare providers. One problem with this 'best practice' approach is that it could be seen to stifle novel approaches to treatment. The Cochrane Collaboration leads this movement. A 2001 review of 160 Cochrane systematic reviews revealed that, according to two readers, 21.3% of the reviews concluded that the evidence was insufficient, 20% concluded that there was evidence of no effect, and 22.5% concluded a positive effect.[26]
In clinical practice doctors personally assess patients in order to diagnose, treat, and prevent disease using clinical judgment. The doctor-patient relationship typically begins with an interaction involving an examination of the patient's medical history and medical record, followed by a medical interview[27] and a physical examination. Basic diagnostic medical devices (e.g. stethoscope, tongue depressor) are typically used. After examining for signs and interviewing for symptoms, the doctor may order medical tests (e.g. blood tests), take a biopsy, or prescribe pharmaceutical drugs or other therapies. Differential diagnosis methods help to rule out conditions based on the information provided. During the encounter, properly informing the patient of all relevant facts is an important part of the relationship and the development of trust. The medical encounter is then documented in the medical record, which is a legal document in many jurisdictions.[28] Follow-up visits may be shorter but follow the same general procedure.
The components of the medical interview[27] and encounter are:
The physical examination is the examination of the patient for signs of disease ('symptoms' are what the patient volunteers; 'signs' are what the healthcare provider detects by examination). The healthcare provider uses the senses of sight, hearing, touch, and sometimes smell (e.g. in infection, uremia, diabetic ketoacidosis). Taste has been made redundant by the availability of modern lab tests. Four actions are taught as the basis of physical examination: inspection, palpation (feel), percussion (tap to determine resonance characteristics), and auscultation (listen). This order may be modified depending on the main focus of the examination (e.g. a joint may be examined by simply "look, feel, move"). Having this set order is an educational tool that encourages practitioners to be systematic in their approach and to refrain from using tools such as the stethoscope before they have fully evaluated the other modalities.
The clinical examination involves study of:
It is likely to be focussed on areas of interest highlighted in the medical history and may not include everything listed above.
Results of laboratory and imaging studies may be obtained, if necessary.
The medical decision-making (MDM) process involves analysis and synthesis of all the above data to come up with a list of possible diagnoses (the differential diagnoses), along with an idea of what needs to be done to obtain a definitive diagnosis that would explain the patient's problem.
The treatment plan may include ordering additional laboratory tests and studies, starting therapy, referral to a specialist, or watchful observation. Follow-up may be advised.
This process is used by primary care providers as well as specialists. It may take only a few minutes if the problem is simple and straightforward. On the other hand, it may take weeks in a patient who has been hospitalized with bizarre symptoms or multi-system problems, with involvement by several specialists.
On subsequent visits, the process may be repeated in an abbreviated manner to obtain any new history, symptoms, physical findings, and lab or imaging results or specialist consultations.
Contemporary medicine is in general conducted within health care systems. Legal, credentialing and financing frameworks are established by individual governments, augmented on occasion by international organizations. The characteristics of any given health care system have significant impact on the way medical care is provided.
Advanced industrial countries (with the exception of the United States) [29][30] and many developing countries provide medical services through a system of universal health care which aims to guarantee care for all through a single-payer health care system, or compulsory private or co-operative health insurance. This is intended to ensure that the entire population has access to medical care on the basis of need rather than ability to pay. Delivery may be via private medical practices or by state-owned hospitals and clinics, or by charities; most commonly by a combination of all three.
Most tribal societies, but also some communist countries (e.g. China) and the United States,[29][30] provide no guarantee of health care for the population as a whole. In such societies, health care is available to those who can afford to pay for it, who have insured themselves (either directly or as part of an employment contract), or who may be covered by care financed directly by the government or tribe.
Transparency of information is another factor defining a delivery system. Access to information on conditions, treatments, quality and pricing greatly affects the choice by patients / consumers and therefore the incentives of medical professionals. While the US health care system has come under fire for lack of openness,[31] new legislation may encourage greater openness. There is a perceived tension between the need for transparency on the one hand and such issues as patient confidentiality and the possible exploitation of information for commercial gain on the other.
Provision of medical care is classified into primary, secondary and tertiary care categories.
Primary care medical services are provided by physicians, physician assistants, nurse practitioners, or other health professionals who have first contact with a patient seeking medical treatment or care. These services are provided in physician offices, clinics, nursing homes, schools, during home visits, and in other places close to patients. About 90% of medical visits can be handled by the primary care provider. These include the treatment of acute and chronic illnesses, preventive care, and health education for all ages and both sexes.
Secondary care medical services are provided by medical specialists in their offices or clinics, or at local community hospitals, for a patient referred by a primary care provider who first diagnosed or treated the patient. Referrals are made for those patients who require the expertise or procedures performed by specialists. These include both ambulatory care and inpatient services: emergency rooms, intensive care medicine, surgery services, physical therapy, labor and delivery, endoscopy units, diagnostic laboratory and medical imaging services, hospice centers, etc. Some primary care providers may also take care of hospitalized patients and deliver babies in a secondary care setting.
Tertiary care medical services are provided by specialist hospitals or regional centers equipped with diagnostic and treatment facilities not generally available at local hospitals. These include trauma centers, burn treatment centers, advanced neonatology unit services, organ transplants, high-risk pregnancy, radiation oncology, etc.
Modern medical care also depends on information, which in many health care settings is still delivered on paper records but increasingly by electronic means.
Working together as an interdisciplinary team, many highly trained health professionals besides medical practitioners are involved in the delivery of modern health care. Examples include: nurses, emergency medical technicians and paramedics, laboratory scientists, pharmacists, physiotherapists, respiratory therapists, speech therapists, occupational therapists, radiographers, dietitians and bioengineers.
The scope and sciences underpinning human medicine overlap many other fields. Dentistry, while a separate discipline from medicine, is considered a medical field.
A patient admitted to hospital is usually under the care of a specific team based on their main presenting problem, e.g. the Cardiology team, who then may interact with other specialties, e.g. surgical, radiology, to help diagnose or treat the main problem or any subsequent complications / developments.
Physicians have many specializations and subspecializations into certain branches of medicine, which are listed below. Which subspecialties belong to which specialties varies from country to country.
The main branches of medicine are:
In the broadest meaning of "medicine", there are many different specialties. In the UK, most specialities have their own body or college (collectively known as the Royal Colleges, although not all currently use the term "Royal"), each with its own entrance exam. The development of a speciality is often driven by new technology (such as the development of effective anaesthetics) or new ways of working (e.g. emergency departments), which leads to the desire to form a unifying body of doctors and, in turn, to the prestige of administering their own exam.
Within medical circles, specialities usually fit into one of two broad categories: "Medicine" and "Surgery". "Medicine" refers to the practice of non-operative medicine, and most subspecialties in this area require preliminary training in "Internal Medicine". In the UK this would traditionally have been evidenced by obtaining the MRCP (an exam allowing Membership of the Royal College of Physicians, or of the equivalent college in Scotland or Ireland). "Surgery" refers to the practice of operative medicine, and most subspecialties in this area require preliminary training in "General Surgery" (in the UK, Membership of the Royal College of Surgeons of England, MRCS). There are some specialties of medicine that do not at present fit easily into either of these categories, such as radiology, pathology, or anaesthesia. Most of these have branched from one or other of the two camps above; for example, anaesthesia developed first as a faculty of the Royal College of Surgeons (for which MRCS/FRCS would have been required) before becoming the Royal College of Anaesthetists, and membership of that college is obtained by sitting the FRCA (Fellowship of the Royal College of Anaesthetists) exam.
Surgical specialties employ operative treatment. In addition, surgeons must decide when an operation is necessary, and also treat many non-surgical issues, particularly in the surgical intensive care unit (SICU), where a variety of critical issues arise. Surgery has many subspecialties, e.g. general surgery, cardiovascular surgery, colorectal surgery, neurosurgery, maxillofacial surgery, orthopedic surgery, otolaryngology, plastic surgery, oncologic surgery, transplant surgery, trauma surgery, urology, vascular surgery, and pediatric surgery. In some centers, anesthesiology is part of the division of surgery (for historical and logistical reasons), although it is not a surgical discipline.
Surgical training in the U.S. requires a minimum of five years of residency after medical school. Sub-specialties of surgery often require seven or more years. In addition, fellowships can last an additional one to three years. Because post-residency fellowships can be competitive, many trainees devote two additional years to research. Thus, in some cases, surgical training will not finish until more than a decade after medical school. Furthermore, surgical training can be very difficult and time-consuming.
Internal medicine is the medical specialty concerned with the diagnosis, management and nonsurgical treatment of unusual or serious diseases, either of one particular organ system or of the body as a whole. According to some sources, an emphasis on internal structures is implied.[32] In North America, specialists in internal medicine are commonly called "internists". Elsewhere, especially in Commonwealth nations, such specialists are often called physicians.[33] These terms, internist or physician (in the narrow sense, common outside North America), generally exclude practitioners of gynecology and obstetrics, pathology, psychiatry, and especially surgery and its subspecialities.
Because their patients are often seriously ill or require complex investigations, internists do much of their work in hospitals. Formerly, many internists were not subspecialized; such general physicians would see any complex nonsurgical problem; this style of practice has become much less common. In modern urban practice, most internists are subspecialists: that is, they generally limit their medical practice to problems of one organ system or to one particular area of medical knowledge. For example, gastroenterologists and nephrologists specialize respectively in diseases of the gut and the kidneys.[34]
In Commonwealth and some other countries, specialist pediatricians and geriatricians are also described as specialist physicians (or internists) who have subspecialized by age of patient rather than by organ system. Elsewhere, especially in North America, general pediatrics is often a form of Primary care.
There are many subspecialities (or subdisciplines) of internal medicine:
Training in internal medicine (as opposed to surgical training) varies considerably across the world: see the articles on Medical education and Physician for more details. In North America, it requires at least three years of residency training after medical school, which can then be followed by a one- to three-year fellowship in the subspecialties listed above. In general, resident work hours in medicine are fewer than those in surgery, averaging about 60 hours per week in the USA. This difference does not apply in the UK, where all doctors are now required by law to work less than 48 hours per week on average.
The following are some major medical specialties that do not fit directly into any of the above-mentioned groups.
Some interdisciplinary sub-specialties of medicine include:
Medical education and training varies around the world. It typically involves entry level education at a university medical school, followed by a period of supervised practice or internship, and/or residency. This can be followed by postgraduate vocational training. A variety of teaching methods have been employed in medical education, still itself a focus of active research.
Many regulatory authorities require continuing medical education, since knowledge, techniques and medical technology continue to evolve at a rapid rate.
In most countries, it is a legal requirement for a medical doctor to be licensed or registered. In general, this entails a medical degree from a university and accreditation by a medical board or an equivalent national organization, which may ask the applicant to pass exams. This restricts the considerable legal authority of the medical profession to physicians who are trained and qualified by national standards. It is also intended as an assurance to patients and as a safeguard against charlatans who practice inadequate medicine for personal gain. While the laws generally require medical doctors to be trained in "evidence-based", Western, or Hippocratic medicine, they are not intended to discourage different paradigms of health.
Doctors who are negligent or intentionally harmful in their care of patients can face charges of medical malpractice and be subject to civil, criminal, or professional sanctions.
The Catholic social theorist Ivan Illich subjected contemporary western medicine to detailed attack in his Medical Nemesis, first published in 1975. He argued that the medicalization in recent decades of so many of life's vicissitudes — birth and death, for example — frequently caused more harm than good and rendered many people in effect lifelong patients. He marshalled a body of statistics to show what he considered the shocking extent of post-operative side-effects and drug-induced illness in advanced industrial society. He was the first to introduce to a wider public the notion of iatrogenesis.[35] Others have since voiced similar views, but none so trenchantly, perhaps, as Illich.[36]
Through the course of the twentieth century, healthcare providers focused increasingly on the technology that was enabling them to make dramatic improvements in patients' health. The ensuing development of a more mechanistic, detached practice, with the perception of an attendant loss of patient-focused care, known as the medical model of health, led to criticisms that medicine was neglecting a holistic model. The inability of modern medicine to properly address some common complaints continues to prompt many people to seek support from alternative medicine. Although most alternative approaches lack scientific validation, some, notably acupuncture for some conditions and certain herbs, are backed by evidence.[37]
Medical errors and overmedication are also the focus of complaints and negative coverage. Practitioners of human factors engineering believe that there is much that medicine may usefully gain by emulating concepts in aviation safety, where it is recognized that it is dangerous to place too much responsibility on one "superhuman" individual and expect him or her not to make errors. Reporting systems and checking mechanisms are becoming more common in identifying sources of error and improving practice. The merits of clinical judgment versus statistical, algorithmic diagnostic methods were famously examined in psychiatric practice in a 1954 book by Paul E. Meehl, which controversially found statistical methods superior.[38] A 2000 meta-analysis comparing these methods in both psychology and medicine found that statistical or "mechanical" diagnostic methods were generally, although not always, superior.[38]
Disparities in the quality of care given are often an additional cause of controversy.[39] For example, a 2008 study found that elderly mentally ill patients received poorer care during hospitalization.[40] Poor rural African-American men were used in the Tuskegee syphilis study, which denied them basic medical care.
The highest honor awarded in medicine is the Nobel Prize in Physiology or Medicine, awarded since 1901 by the Nobel Assembly at the Karolinska Institute.